Robots.txt Disallow command

Posted by Saahil Sinha on Pro Webmasters
Published on 2013-06-23T02:47:18Z

How do I disallow folders through robots.txt when they are being crawled due to a wrong URL structure, which causes duplicate page errors?

The URL being crawled incorrectly by Google, leading to the duplicate page error:

www.abc.com/forum/index.php?option=com_forum

However, the actual correct page is:

www.abc.com/index.php?option=com_forum

Is this the correct way to exclude

www.abc.com/forum/index.php?option=com_forum

through robots.txt? Below is the directive:

Disallow: /forum/
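
For context, a complete robots.txt also needs a User-agent line; a minimal sketch, assuming the rule should apply to all crawlers:

User-agent: *
Disallow: /forum/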

Will it not also block the site's legitimate 'forum' component folder?
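
Alternatively, since Disallow matches by URL-path prefix, would a narrower rule like the one below block only the incorrect index.php URLs (including their query strings) while leaving the rest of /forum/ crawlable? This is a sketch assuming only those URLs are unwanted:

User-agent: *
Disallow: /forum/index.php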

